Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/10/01 02:30:29 UTC

Build failed in Jenkins: beam_PerformanceTests_JDBC #6985

See <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/6985/display/redirect?page=changes>

Changes:

[noreply] Support VR test including TestStream for Spark runner in streaming mode

[noreply] Add cron job to trigger Java JMH micro-benchmarks weekly  (#23388)


------------------------------------------
[...truncated 424.70 KB...]
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.441Z: Fusing consumer JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.467Z: Fusing consumer JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.496Z: Fusing consumer JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.533Z: Fusing consumer JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.560Z: Fusing consumer JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Reshuffle.ViaRandomKey/Values/Values/Map into JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.591Z: Fusing consumer ParDo(TimeMonitor) into JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Reshuffle.ViaRandomKey/Values/Values/Map
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.623Z: Fusing consumer Count All/Combine.perKey(Count)/GroupByKey+Count All/Combine.perKey(Count)/Combine.GroupedValues/Partial into Count All/WithKeys/AddKeys/Map
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.656Z: Fusing consumer Count All/Combine.perKey(Count)/GroupByKey/Reify into Count All/Combine.perKey(Count)/GroupByKey+Count All/Combine.perKey(Count)/Combine.GroupedValues/Partial
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.680Z: Fusing consumer Hash row contents/WithKeys/AddKeys/Map into ParDo(SelectName)
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.714Z: Fusing consumer Hash row contents/Combine.perKey(Hashing)/GroupByKey+Hash row contents/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Hash row contents/WithKeys/AddKeys/Map
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.748Z: Fusing consumer Hash row contents/Combine.perKey(Hashing)/GroupByKey/Reify into Hash row contents/Combine.perKey(Hashing)/GroupByKey+Hash row contents/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.785Z: Fusing consumer Combine.globally(Top(Reversed))/Combine.perKey(Top(Reversed))/GroupByKey+Combine.globally(Top(Reversed))/Combine.perKey(Top(Reversed))/Combine.GroupedValues/Partial into Combine.globally(Top(Reversed))/WithKeys/AddKeys/Map
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.815Z: Fusing consumer Combine.globally(Top(Reversed))/Combine.perKey(Top(Reversed))/GroupByKey/Reify into Combine.globally(Top(Reversed))/Combine.perKey(Top(Reversed))/GroupByKey+Combine.globally(Top(Reversed))/Combine.perKey(Top(Reversed))/Combine.GroupedValues/Partial
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.850Z: Fusing consumer Combine.globally(Top(Natural))/Combine.perKey(Top(Natural))/GroupByKey+Combine.globally(Top(Natural))/Combine.perKey(Top(Natural))/Combine.GroupedValues/Partial into Combine.globally(Top(Natural))/WithKeys/AddKeys/Map
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.880Z: Fusing consumer Combine.globally(Top(Natural))/Combine.perKey(Top(Natural))/GroupByKey/Reify into Combine.globally(Top(Natural))/Combine.perKey(Top(Natural))/GroupByKey+Combine.globally(Top(Natural))/Combine.perKey(Top(Natural))/Combine.GroupedValues/Partial
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.905Z: Fusing consumer Count All/Combine.perKey(Count)/GroupByKey/Write into Count All/Combine.perKey(Count)/GroupByKey/Reify
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.939Z: Fusing consumer Count All/Combine.perKey(Count)/Combine.GroupedValues into Count All/Combine.perKey(Count)/GroupByKey/Read
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:43.975Z: Fusing consumer Count All/Combine.perKey(Count)/Combine.GroupedValues/Extract into Count All/Combine.perKey(Count)/Combine.GroupedValues
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.010Z: Fusing consumer Count All/Values/Values/Map into Count All/Combine.perKey(Count)/Combine.GroupedValues/Extract
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.046Z: Fusing consumer Count All/ProduceDefault into Count All/CreateVoid/Read(CreateSource)
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.078Z: Fusing consumer Hash row contents/Combine.perKey(Hashing)/GroupByKey/Write into Hash row contents/Combine.perKey(Hashing)/GroupByKey/Reify
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.111Z: Fusing consumer Hash row contents/Combine.perKey(Hashing)/Combine.GroupedValues into Hash row contents/Combine.perKey(Hashing)/GroupByKey/Read
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.137Z: Fusing consumer Hash row contents/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Hash row contents/Combine.perKey(Hashing)/Combine.GroupedValues
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.174Z: Fusing consumer Hash row contents/Values/Values/Map into Hash row contents/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.203Z: Fusing consumer PAssert$2/GroupGlobally/Reify.Window/ParDo(Anonymous) into Hash row contents/Values/Values/Map
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.240Z: Fusing consumer PAssert$2/GroupGlobally/ParDo(ToSingletonIterables) into PAssert$2/GroupGlobally/Reify.Window/ParDo(Anonymous)
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.269Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.277Z: Fusing consumer PAssert$2/GroupGlobally/GroupByKey/Write into PAssert$2/GroupGlobally/GroupByKey/Reify
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.312Z: Fusing consumer PAssert$2/GroupGlobally/GroupByKey/GroupByWindow into PAssert$2/GroupGlobally/GroupByKey/Read
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.347Z: Fusing consumer PAssert$2/GroupGlobally/Values/Values/Map into PAssert$2/GroupGlobally/GroupByKey/GroupByWindow
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.384Z: Fusing consumer PAssert$2/GroupGlobally/ParDo(Concat) into PAssert$2/GroupGlobally/Values/Values/Map
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.420Z: Fusing consumer PAssert$2/GetPane/Map into PAssert$2/GroupGlobally/ParDo(Concat)
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.450Z: Fusing consumer PAssert$2/RunChecks into PAssert$2/GetPane/Map
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.476Z: Fusing consumer PAssert$2/VerifyAssertions/ParDo(DefaultConclude) into PAssert$2/RunChecks
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.505Z: Fusing consumer Combine.globally(Top(Reversed))/Combine.perKey(Top(Reversed))/GroupByKey/Write into Combine.globally(Top(Reversed))/Combine.perKey(Top(Reversed))/GroupByKey/Reify
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.537Z: Fusing consumer Combine.globally(Top(Reversed))/Combine.perKey(Top(Reversed))/Combine.GroupedValues into Combine.globally(Top(Reversed))/Combine.perKey(Top(Reversed))/GroupByKey/Read
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.562Z: Fusing consumer Combine.globally(Top(Reversed))/Combine.perKey(Top(Reversed))/Combine.GroupedValues/Extract into Combine.globally(Top(Reversed))/Combine.perKey(Top(Reversed))/Combine.GroupedValues
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.588Z: Fusing consumer Combine.globally(Top(Reversed))/Values/Values/Map into Combine.globally(Top(Reversed))/Combine.perKey(Top(Reversed))/Combine.GroupedValues/Extract
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.620Z: Fusing consumer Combine.globally(Top(Reversed))/ProduceDefault into Combine.globally(Top(Reversed))/CreateVoid/Read(CreateSource)
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.655Z: Fusing consumer Combine.globally(Top(Natural))/Combine.perKey(Top(Natural))/GroupByKey/Write into Combine.globally(Top(Natural))/Combine.perKey(Top(Natural))/GroupByKey/Reify
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.689Z: Fusing consumer Combine.globally(Top(Natural))/Combine.perKey(Top(Natural))/Combine.GroupedValues into Combine.globally(Top(Natural))/Combine.perKey(Top(Natural))/GroupByKey/Read
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.723Z: Fusing consumer Combine.globally(Top(Natural))/Combine.perKey(Top(Natural))/Combine.GroupedValues/Extract into Combine.globally(Top(Natural))/Combine.perKey(Top(Natural))/Combine.GroupedValues
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.755Z: Fusing consumer Combine.globally(Top(Natural))/Values/Values/Map into Combine.globally(Top(Natural))/Combine.perKey(Top(Natural))/Combine.GroupedValues/Extract
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.780Z: Fusing consumer Combine.globally(Top(Natural))/ProduceDefault into Combine.globally(Top(Natural))/CreateVoid/Read(CreateSource)
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.809Z: Unzipping flatten s42-u187 for input s44.org.apache.beam.sdk.values.PCollection.<init>:405#9d175ab169cd3bcf-c185
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.833Z: Fusing unzipped copy of PAssert$2/GroupGlobally/GroupByKey/Reify, through flatten PAssert$2/GroupGlobally/Flatten.PCollections/Unzipped-1, into producer PAssert$2/GroupGlobally/WithKeys/AddKeys/Map
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.859Z: Fusing consumer PAssert$1/GroupGlobally/WithKeys/AddKeys/Map into PAssert$1/GroupGlobally/Create.Values/Read(CreateSource)
    Oct 01, 2022 2:25:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.881Z: Fusing consumer PAssert$3/GroupGlobally/WithKeys/AddKeys/Map into PAssert$3/GroupGlobally/Create.Values/Read(CreateSource)
    Oct 01, 2022 2:25:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.906Z: Fusing consumer PAssert$4/GroupGlobally/WithKeys/AddKeys/Map into PAssert$4/GroupGlobally/Create.Values/Read(CreateSource)
    Oct 01, 2022 2:25:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.936Z: Fusing consumer PAssert$2/GroupGlobally/WithKeys/AddKeys/Map into PAssert$2/GroupGlobally/Create.Values/Read(CreateSource)
    Oct 01, 2022 2:25:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:44.964Z: Fusing consumer PAssert$1/GroupGlobally/GroupByKey/Reify into PAssert$1/GroupGlobally/WithKeys/AddKeys/Map
    Oct 01, 2022 2:25:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:45.002Z: Fusing consumer PAssert$3/GroupGlobally/GroupByKey/Reify into PAssert$3/GroupGlobally/WithKeys/AddKeys/Map
    Oct 01, 2022 2:25:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:45.036Z: Fusing consumer PAssert$4/GroupGlobally/GroupByKey/Reify into PAssert$4/GroupGlobally/WithKeys/AddKeys/Map
    Oct 01, 2022 2:25:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:45.073Z: Fusing consumer PAssert$2/GroupGlobally/GroupByKey/Reify into PAssert$2/GroupGlobally/WithKeys/AddKeys/Map
    Oct 01, 2022 2:25:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:45.492Z: Executing operation JdbcIO.Read/Create.Values/Read(CreateSource)+JdbcIO.Read/JdbcIO.ReadAll/ParDo(Read)+JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Consume/ParDo(Anonymous)+JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)
    Oct 01, 2022 2:25:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:25:45.579Z: Starting 5 workers in us-central1-a...
    Oct 01, 2022 2:26:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:26:27.739Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 01, 2022 2:26:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:26:58.651Z: Workers have started successfully.
    Oct 01, 2022 2:27:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2022-10-01T02:27:20.760Z: org.postgresql.util.PSQLException: ERROR: relation "beamtest_it_2022_10_01_02_08_44_863" does not exist
      Position: 21
    	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2553)
    	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2285)
    	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:323)
    	at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:473)
    	at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:393)
    	at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:164)
    	at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:114)
    	at org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1381)
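
    The failing stage is the JdbcIO read of the per-run table beamtest_it_2022_10_01_02_08_44_863; "relation ... does not exist" means the table was never created (or was already dropped) before the Dataflow workers ran their SELECT. For orientation only, a minimal sketch of the kind of read the test issues, assuming a plain PostgreSQL DataSource and the TestRow type from Beam's io-common test utilities; the query text and helper names are illustrative, not the test's actual code:

        // Sketch (illustrative names): a JdbcIO read against a per-run table.
        // If that table is missing, every worker fails exactly as logged above.
        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.coders.SerializableCoder;
        import org.apache.beam.sdk.io.common.TestRow;
        import org.apache.beam.sdk.io.jdbc.JdbcIO;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;
        import org.postgresql.ds.PGSimpleDataSource;

        public class ReadSketch {
          public static void main(String[] args) {
            PGSimpleDataSource dataSource = new PGSimpleDataSource();
            dataSource.setServerName("34.66.99.208");   // values taken from the test options later in this thread
            dataSource.setDatabaseName("postgres");
            dataSource.setUser("postgres");
            dataSource.setPassword("<password>");

            String tableName = "beamtest_it_2022_10_01_02_08_44_863";  // per-run table name
            Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
            p.apply(
                JdbcIO.<TestRow>read()
                    .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(dataSource))
                    .withQuery("select name, id from " + tableName)
                    .withRowMapper(rs -> TestRow.create(rs.getInt("id"), rs.getString("name")))
                    .withCoder(SerializableCoder.of(TestRow.class)));
            p.run().waitUntilFinish();
          }
        }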

    Oct 01, 2022 2:27:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2022-10-01T02:27:22.282Z: org.postgresql.util.PSQLException: ERROR: relation "beamtest_it_2022_10_01_02_08_44_863" does not exist
      Position: 21
    	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2553)
    	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2285)
    	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:323)
    	at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:473)
    	at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:393)
    	at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:164)
    	at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:114)
    	at org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1381)

    Oct 01, 2022 2:27:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2022-10-01T02:27:26.482Z: org.postgresql.util.PSQLException: ERROR: relation "beamtest_it_2022_10_01_02_08_44_863" does not exist
      Position: 21
    	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2553)
    	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2285)
    	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:323)
    	at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:473)
    	at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:393)
    	at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:164)
    	at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:114)
    	at org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1381)

    Oct 01, 2022 2:27:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2022-10-01T02:27:26.976Z: org.postgresql.util.PSQLException: ERROR: relation "beamtest_it_2022_10_01_02_08_44_863" does not exist
      Position: 21
    	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2553)
    	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2285)
    	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:323)
    	at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:473)
    	at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:393)
    	at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:164)
    	at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:114)
    	at org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1381)

    Oct 01, 2022 2:27:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:27:27.338Z: Finished operation JdbcIO.Read/Create.Values/Read(CreateSource)+JdbcIO.Read/JdbcIO.ReadAll/ParDo(Read)+JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Consume/ParDo(Anonymous)+JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)
    Oct 01, 2022 2:27:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2022-10-01T02:27:27.411Z: Workflow failed. Causes: S01:JdbcIO.Read/Create.Values/Read(CreateSource)+JdbcIO.Read/JdbcIO.ReadAll/ParDo(Read)+JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/Consume/ParDo(Anonymous)+JdbcIO.Read/JdbcIO.ReadAll/JdbcIO.Reparallelize/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:

      jdbcioit0testwritethenrea-09301925-a6gz-harness-x1bh
          Root cause: Work item failed.,

      jdbcioit0testwritethenrea-09301925-a6gz-harness-x1bh
          Root cause: Work item failed.,

      jdbcioit0testwritethenrea-09301925-a6gz-harness-tjqh
          Root cause: Work item failed.,

      jdbcioit0testwritethenrea-09301925-a6gz-harness-tjqh
          Root cause: Work item failed.
    Oct 01, 2022 2:27:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:27:27.492Z: Cleaning up.
    Oct 01, 2022 2:27:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:27:27.558Z: Stopping worker pool...
    Oct 01, 2022 2:29:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:29:48.599Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 01, 2022 2:29:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-10-01T02:29:48.675Z: Worker pool stopped.
    Oct 01, 2022 2:30:25 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2022-09-30_19_25_27-1280563677383239411 failed with status FAILED.

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead STANDARD_OUT
    Load test results for test (ID): 21845f98-0c8b-434b-9a24-28ca3fa963ab and timestamp: 2022-10-01T02:30:25.591000000Z:
                     Metric:                    Value:
                  write_time                    39.397
    Load test results for test (ID): 21845f98-0c8b-434b-9a24-28ca3fa963ab and timestamp: 2022-10-01T02:30:25.591000000Z:
                     Metric:                    Value:
                   read_time                       0.0

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:jdbc:integrationTest FAILED

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead FAILED
    org.postgresql.util.PSQLException: ERROR: table "beamtest_it_2022_10_01_02_08_44_863" does not exist
        at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2553)
        at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2285)
        at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:323)
        at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:473)
        at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:393)
        at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:322)
        at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:308)
        at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:284)
        at org.postgresql.jdbc.PgStatement.executeUpdate(PgStatement.java:258)
        at org.apache.beam.sdk.io.common.DatabaseTestHelper.deleteTable(DatabaseTestHelper.java:107)
        at org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteThenRead(JdbcIOIT.java:147)
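
The exception surfaces in DatabaseTestHelper.deleteTable, i.e. in the test's cleanup: the table the read phase could not find is also missing when the teardown tries to drop it, so the DROP statement fails as well and that is the error JUnit reports. Purely as an illustration of why the teardown throws here (this is not Beam's DatabaseTestHelper implementation), a cleanup that tolerates an already-missing table looks like this in plain JDBC:

    import java.sql.Connection;
    import java.sql.SQLException;
    import java.sql.Statement;
    import javax.sql.DataSource;

    class CleanupSketch {
      // "IF EXISTS" turns the drop into a no-op instead of an error when the table is absent.
      static void dropTableIfExists(DataSource dataSource, String tableName) throws SQLException {
        try (Connection connection = dataSource.getConnection();
            Statement statement = connection.createStatement()) {
          statement.executeUpdate("DROP TABLE IF EXISTS " + tableName);
        }
      }
    }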

3 tests completed, 1 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest>
:sdks:java:io:jdbc:integrationTest (Thread[Execution worker Thread 6,5,main]) completed. Took 21 mins 46.288 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:jdbc:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 22m 15s
135 actionable tasks: 79 executed, 54 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qvfe4i2vkxfog

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_JDBC #6987

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/6987/display/redirect?page=changes>




Build failed in Jenkins: beam_PerformanceTests_JDBC #6986

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/6986/display/redirect>

Changes:


------------------------------------------
[...truncated 323.39 KB...]
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar'.
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is cf939a8e73bacd4cee6ced7e1c9ce8c2
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' with cache key cf939a8e73bacd4cee6ced7e1c9ce8c2
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker Thread 5,5,main]) completed. Took 0.378 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-worker.jar (project :runners:google-cloud-dataflow-java:worker:legacy-worker) (Thread[Execution worker Thread 6,5,main]) started.
work action null (Thread[Execution worker Thread 6,5,main]) completed. Took 0.0 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-worker.jar (project :runners:google-cloud-dataflow-java:worker:legacy-worker) (Thread[Execution worker Thread 6,5,main]) started.
work action null (Thread[Execution worker Thread 6,5,main]) completed. Took 0.0 secs.

> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:io:google-cloud-platform:compileTestJava'.
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' is 131209ca67d467a7040c775282cdee1a
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:io:google-cloud-platform:compileTestJava' with cache key 131209ca67d467a7040c775282cdee1a
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker Thread 4,5,main]) completed. Took 0.575 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker Thread 2,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker Thread 2,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker Thread 6,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker Thread 6,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker Thread 4,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker Thread 4,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker Thread 2,5,main]) started.
This JVM does not support getting OS memory, so no OS memory status updates will be broadcast

> Task :sdks:java:io:google-cloud-platform:testJar
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar' because:
  Not worth caching
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker Thread 2,5,main]) completed. Took 0.348 secs.
work action resolve beam-sdks-java-io-google-cloud-platform-tests.jar (project :sdks:java:io:google-cloud-platform) (Thread[Execution worker Thread 6,5,main]) started.
work action null (Thread[Execution worker Thread 6,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker Thread 5,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker Thread 2,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is f8d50c46088f3d6c04ab90450807bfdb
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key f8d50c46088f3d6c04ab90450807bfdb
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker Thread 2,5,main]) completed. Took 0.298 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker Thread 5,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker Thread 6,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar (Thread[Execution worker Thread 2,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar (Thread[Execution worker Thread 2,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[included builds,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Not worth caching
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/runners/google-cloud-dataflow-java/build/resources/test>', not found
:runners:google-cloud-dataflow-java:testJar (Thread[included builds,5,main]) completed. Took 0.03 secs.
work action resolve beam-runners-google-cloud-dataflow-java-tests.jar (project :runners:google-cloud-dataflow-java) (Thread[Execution worker Thread 6,5,main]) started.
work action null (Thread[Execution worker Thread 6,5,main]) completed. Took 0.0 secs.

> Task :sdks:java:io:jdbc:compileTestJava
Custom actions are attached to task ':sdks:java:io:jdbc:compileTestJava'.
Build cache key for task ':sdks:java:io:jdbc:compileTestJava' is 529f6927d9bd06e36c99381adb4c82e7
Task ':sdks:java:io:jdbc:compileTestJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':sdks:java:io:jdbc:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with toolchain '/usr/lib/jvm/java-8-openjdk-amd64'.
Invalidating in-memory cache of /home/jenkins/.gradle/caches/7.5.1/fileContent/java-modules.bin
Invalidating in-memory cache of /home/jenkins/.gradle/caches/7.5.1/fileContent/annotation-processors.bin
Starting process 'Gradle Worker Daemon 45'. Working directory: /home/jenkins/.gradle/workers Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Xbootclasspath/p:/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.errorprone/javac/9+181-r4173-1/bdf4c0aa7d540ee1f7bf14d47447aea4bbf450c5/javac-9+181-r4173-1.jar -Xmx512m -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/caches/7.5.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Worker Daemon 45'
Successfully started process 'Gradle Worker Daemon 45'
Started Gradle worker daemon (0.377 secs) with fork options DaemonForkOptions{executable=/usr/lib/jvm/java-8-openjdk-amd64/bin/java, minHeapSize=null, maxHeapSize=null, jvmArgs=[-Xbootclasspath/p:/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.errorprone/javac/9+181-r4173-1/bdf4c0aa7d540ee1f7bf14d47447aea4bbf450c5/javac-9+181-r4173-1.jar], keepAliveMode=SESSION}.
Compiling with JDK Java compiler API.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Invalidating in-memory cache of /home/jenkins/.gradle/caches/7.5.1/javaCompile/jarAnalysis.bin
Invalidating in-memory cache of /home/jenkins/.gradle/caches/7.5.1/javaCompile/classAnalysis.bin
Class dependency analysis for incremental compilation took 0.044 secs.
Created classpath snapshot for incremental compilation in 0.256 secs.
Stored cache entry for task ':sdks:java:io:jdbc:compileTestJava' with cache key 529f6927d9bd06e36c99381adb4c82e7
:sdks:java:io:jdbc:compileTestJava (Thread[Execution worker,5,main]) completed. Took 6.494 secs.
Resolve mutations for :sdks:java:io:jdbc:testClasses (Thread[included builds,5,main]) started.
Resolve mutations for :sdks:java:io:jdbc:testClasses (Thread[included builds,5,main]) completed. Took 0.0 secs.
:sdks:java:io:jdbc:testClasses (Thread[Execution worker Thread 2,5,main]) started.

> Task :sdks:java:io:jdbc:testClasses
Skipping task ':sdks:java:io:jdbc:testClasses' as it has no actions.
:sdks:java:io:jdbc:testClasses (Thread[Execution worker Thread 2,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:jdbc:integrationTest (Thread[Execution worker Thread 2,5,main]) started.
Resolve mutations for :sdks:java:io:jdbc:integrationTest (Thread[Execution worker Thread 2,5,main]) completed. Took 0.0 secs.
:sdks:java:io:jdbc:integrationTest (Thread[Execution worker,5,main]) started.
producer locations for task group 0 (Thread[Execution worker Thread 2,5,main]) started.
producer locations for task group 0 (Thread[Execution worker Thread 2,5,main]) completed. Took 0.0 secs.
Gradle Test Executor 46 started executing tests.

> Task :sdks:java:io:jdbc:integrationTest
Custom actions are attached to task ':sdks:java:io:jdbc:integrationTest'.
Build cache key for task ':sdks:java:io:jdbc:integrationTest' is 12f1fe7ff5b25ebd5e7016f4ef867e80
Task ':sdks:java:io:jdbc:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 46'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--numberOfRecords=5000000","--bigQueryDataset=beam_performance","--bigQueryTable=jdbcioit_results","--influxMeasurement=jdbcioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--postgresUsername=postgres","--postgresPassword=uuinkks","--postgresDatabaseName=postgres","--postgresServerName=34.66.99.208","--postgresSsl=false","--postgresPort=5432","--autoscalingAlgorithm=NONE","--numWorkers=5","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.43.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.internal.worker.tmpdir=<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/tmp/integrationTest/work> -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/7.5.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 46'
Successfully started process 'Gradle Test Executor 46'
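
The -DbeamTestPipelineOptions value above is a JSON array of standard Beam pipeline flags that TestPipeline ultimately parses through PipelineOptionsFactory. A minimal sketch of that mechanism, with made-up values and only options defined on the base PipelineOptions interface so it runs with just the core SDK on the classpath:

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    class OptionsSketch {
      public static void main(String[] args) {
        // Each "--name=value" flag maps onto a getter/setter pair on a PipelineOptions interface.
        String[] flags = {
          "--jobName=jdbcioit-local-check",
          "--tempLocation=gs://temp-storage-for-perf-tests"
        };
        PipelineOptions options = PipelineOptionsFactory.fromArgs(flags).withValidation().create();
        System.out.println(options.getJobName() + " -> " + options.getTempLocation());
      }
    }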

org.apache.beam.sdk.io.jdbc.JdbcIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.43.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-simple/1.7.30/e606eac955f55ecf1d8edcccba04eb8ac98088dd/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithAutosharding FAILED
    org.postgresql.util.PSQLException: The connection attempt failed.
        at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:315)
        at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
        at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:225)
        at org.postgresql.Driver.makeConnection(Driver.java:465)
        at org.postgresql.Driver.connect(Driver.java:264)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:247)
        at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:103)
        at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:87)
        at org.apache.beam.sdk.io.common.DatabaseTestHelper.createTable(DatabaseTestHelper.java:89)
        at org.apache.beam.sdk.io.common.DatabaseTestHelper.createTable(DatabaseTestHelper.java:97)
        at org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteWithAutosharding(JdbcIOIT.java:273)

        Caused by:
        java.net.SocketTimeoutException: connect timed out
            at java.net.PlainSocketImpl.socketConnect(Native Method)
            at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
            at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
            at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
            at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
            at java.net.Socket.connect(Socket.java:607)
            at org.postgresql.core.PGStream.createSocket(PGStream.java:231)
            at org.postgresql.core.PGStream.<init>(PGStream.java:95)
            at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:98)
            at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:213)
            ... 11 more
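
Unlike build #6985, which reached the database and failed on a missing table, every failure in this run is a SocketTimeoutException while opening the JDBC connection: the Jenkins agent could not reach the PostgreSQL server (34.66.99.208:5432) at all, which points at connectivity or infrastructure rather than test logic. A minimal standalone probe, assuming only the PostgreSQL JDBC driver on the classpath, that distinguishes the two situations:

    import java.sql.Connection;
    import java.sql.DriverManager;

    class PgProbe {
      public static void main(String[] args) throws Exception {
        // connectTimeout is in seconds; keep it short so an unreachable host fails fast,
        // mirroring the "connect timed out" failures above.
        String url = "jdbc:postgresql://34.66.99.208:5432/postgres?connectTimeout=10";
        DriverManager.setLoginTimeout(10);
        try (Connection connection = DriverManager.getConnection(url, "postgres", "<password>")) {
          System.out.println("Connected: " + connection.getMetaData().getDatabaseProductVersion());
        }
      }
    }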

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults FAILED
    org.postgresql.util.PSQLException: The connection attempt failed.
        at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:315)
        at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
        at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:225)
        at org.postgresql.Driver.makeConnection(Driver.java:465)
        at org.postgresql.Driver.connect(Driver.java:264)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:247)
        at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:103)
        at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:87)
        at org.apache.beam.sdk.io.common.DatabaseTestHelper.createTable(DatabaseTestHelper.java:89)
        at org.apache.beam.sdk.io.common.DatabaseTestHelper.createTable(DatabaseTestHelper.java:97)
        at org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteWithWriteResults(JdbcIOIT.java:383)

        Caused by:
        java.net.SocketTimeoutException: connect timed out
            at java.net.PlainSocketImpl.socketConnect(Native Method)
            at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
            at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
            at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
            at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
            at java.net.Socket.connect(Socket.java:607)
            at org.postgresql.core.PGStream.createSocket(PGStream.java:231)
            at org.postgresql.core.PGStream.<init>(PGStream.java:95)
            at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:98)
            at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:213)
            ... 11 more

Gradle Test Executor 46 finished executing tests.

> Task :sdks:java:io:jdbc:integrationTest

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead FAILED
    org.postgresql.util.PSQLException: The connection attempt failed.
        at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:315)
        at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
        at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:225)
        at org.postgresql.Driver.makeConnection(Driver.java:465)
        at org.postgresql.Driver.connect(Driver.java:264)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:247)
        at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:103)
        at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:87)
        at org.apache.beam.sdk.io.common.DatabaseTestHelper.createTable(DatabaseTestHelper.java:89)
        at org.apache.beam.sdk.io.common.DatabaseTestHelper.createTable(DatabaseTestHelper.java:97)
        at org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteThenRead(JdbcIOIT.java:136)

        Caused by:
        java.net.SocketTimeoutException: connect timed out
            at java.net.PlainSocketImpl.socketConnect(Native Method)
            at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
            at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
            at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
            at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
            at java.net.Socket.connect(Socket.java:607)
            at org.postgresql.core.PGStream.createSocket(PGStream.java:231)
            at org.postgresql.core.PGStream.<init>(PGStream.java:95)
            at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:98)
            at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:213)
            ... 11 more

3 tests completed, 3 failed
Finished generating test XML results (0.001 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest>

> Task :sdks:java:io:jdbc:integrationTest FAILED
:sdks:java:io:jdbc:integrationTest (Thread[Execution worker,5,main]) completed. Took 33.962 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:jdbc:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50s
135 actionable tasks: 80 executed, 53 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zgzozcfz5tveu

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org